
# High Sparsity Inference Acceleration

## MiniCPM-S-1B-sft

Developer: openbmb · License: Apache-2.0

MiniCPM-S-1B-sft is a 1B-parameter language model optimized with activation sparsity techniques, achieving high-sparsity inference acceleration through the ProSparse method while maintaining performance comparable to the original model.

Tags: Large Language Model, Transformers, Multilingual
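
The ProSparse approach induces ReLU-style activation sparsity in the model's feed-forward layers, so many intermediate activations are exactly zero and can, in principle, be skipped by sparse inference kernels. The sketch below is a minimal, unofficial example of measuring that sparsity with Hugging Face Transformers; the repo id `openbmb/MiniCPM-S-1B-sft` and the `mlp.act_fn` module name are assumptions and may differ from the actual release.

```python
# Minimal sketch: measure FFN activation sparsity of MiniCPM-S-1B-sft.
# Assumptions: the Hugging Face repo id and the "mlp.act_fn" module name;
# this is not an official usage example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-S-1B-sft"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", trust_remote_code=True
)

prompt = "Activation sparsity speeds up inference because"
inputs = tokenizer(prompt, return_tensors="pt")

# Hook the FFN activation outputs and record the fraction of exact zeros;
# with ReLU-based activations, these zeros are what a sparse kernel can skip.
sparsity = []

def record_sparsity(module, inp, out):
    sparsity.append((out == 0).float().mean().item())

hooks = []
for name, module in model.named_modules():
    if name.endswith("mlp.act_fn"):  # assumed module name; varies by architecture
        hooks.append(module.register_forward_hook(record_sparsity))

with torch.no_grad():
    model(**inputs)

for h in hooks:
    h.remove()

if sparsity:
    print(f"Mean FFN activation sparsity: {sum(sparsity) / len(sparsity):.2%}")
```

Note that exact-zero activations only translate into wall-clock speedups when the runtime uses kernels that exploit them; a standard dense forward pass will show the sparsity but not the acceleration.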